2 research outputs found

    Inferring Mood-While-Eating with Smartphone Sensing and Community-Based Model Personalization

    The interplay between mood and eating has been the subject of extensive research within the fields of nutrition and behavioral science, indicating a strong connection between the two. Further, phone sensor data have been used to characterize both eating behavior and mood, independently, in the context of mobile food diaries and mobile health applications. However, limitations within the current body of literature include: i) the lack of investigation into how well mood inference models trained with passive sensor data from a range of everyday life situations generalize to specific contexts such as eating, ii) the absence of prior studies that use sensor data to study the intersection of mood and eating, and iii) the inadequate examination of model personalization techniques in limited-label settings, as commonly encountered in mood inference. In this study, we examined everyday eating behavior and mood using two datasets of college students in Mexico (N_mex = 84, 1843 mood-while-eating reports) and eight countries (N_mul = 678, 329K mood reports incl. 24K mood-while-eating reports), containing both passive smartphone sensing and self-report data. Our results indicate that generic mood inference models decline in performance in certain contexts, such as while eating. Additionally, we found that population-level (non-personalized) and hybrid (partially personalized) modeling techniques were inadequate for the commonly used three-class mood inference task (positive, neutral, negative). Furthermore, user-level modeling was challenging for the majority of participants due to a lack of sufficient labels and data from the negative class. To address these limitations, we employed a novel community-based approach to personalization, building models with data from a set of users similar to the target user.
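    The abstract only outlines the community-based personalization idea, so the following is a minimal sketch of one way it could be set up: measure user similarity on per-user feature profiles, pool labeled data from the K most similar users, train a classifier, and evaluate on the held-out target user. The column names, similarity measure, choice of K, and classifier are illustrative assumptions, not the authors' exact pipeline.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score
    from sklearn.metrics.pairwise import cosine_similarity

    def community_personalized_eval(df, feature_cols, label_col="mood",
                                    user_col="user_id", k=10):
        """For each user, train on the k most similar users and test on that user."""
        # Per-user profile: mean of the sensing features, used only to measure similarity.
        profiles = df.groupby(user_col)[feature_cols].mean()
        sim = cosine_similarity(profiles.values)
        users = profiles.index.to_numpy()
        scores = {}
        for i, target in enumerate(users):
            # Pick the k most similar users, excluding the target itself.
            order = np.argsort(-sim[i])
            community = [users[j] for j in order if users[j] != target][:k]
            train = df[df[user_col].isin(community)]
            test = df[df[user_col] == target]
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            clf.fit(train[feature_cols], train[label_col])
            pred = clf.predict(test[feature_cols])
            scores[target] = f1_score(test[label_col], pred, average="macro")
        return scores
    ```

    The design choice this illustrates is that the "community" model sits between a fully generic population model (train on everyone) and a fully personal model (train only on the target user's scarce labels), which is what makes it attractive when negative-class labels are rare.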

    Sensing Eating Events in Context: A Smartphone-Only Approach

    While the task of automatically detecting eating events has been examined in prior work using various wearable devices, the use of smartphones as standalone devices to infer eating events remains an open issue. This paper proposes a framework that infers eating vs. non-eating events from passive smartphone sensing and evaluates it on a dataset of 58 college students. First, we show that time of day and features from modalities such as screen usage, accelerometer, app usage, and location are indicative of eating and non-eating events. Then, we show that eating events can be inferred with an AUROC (area under the receiver operating characteristic curve) of 0.65 using subject-independent machine learning models, which can be further improved to 0.81 for both subject-dependent and hybrid models using personalization techniques. Moreover, we show that users have different behavioral and contextual routines around eating episodes, requiring specific feature groups to train fully personalized models. These findings are of potential value for future context-aware mobile food diary apps: enabling scalable sensing-based eating studies using only smartphones; detecting under-reported eating events, thus increasing data quality in self-report-based studies; providing functionality to track food consumption and generate reminders for on-time collection of food diaries; and supporting mobile interventions towards healthy eating practices. (Accepted for publication in IEEE Access.)
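    As a rough illustration of the subject-independent evaluation described above, the sketch below runs a leave-one-subject-out loop: train an eating vs. non-eating classifier on all users except one, score the held-out user, and report mean AUROC. The feature and column names and the classifier are assumptions made for the example, not the paper's actual feature set or model.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score

    def loso_auroc(df, feature_cols, label_col="is_eating", user_col="user_id"):
        """Subject-independent AUROC via leave-one-subject-out cross-validation."""
        aurocs = []
        for user in df[user_col].unique():
            train = df[df[user_col] != user]
            test = df[df[user_col] == user]
            if test[label_col].nunique() < 2:
                continue  # AUROC is undefined if the held-out user has only one class
            clf = GradientBoostingClassifier(random_state=0)
            clf.fit(train[feature_cols], train[label_col])
            proba = clf.predict_proba(test[feature_cols])[:, 1]
            aurocs.append(roc_auc_score(test[label_col], proba))
        return float(np.mean(aurocs))

    # Example usage, assuming a frame with per-window smartphone features:
    # score = loso_auroc(df, ["hour_of_day", "screen_on_ratio",
    #                         "accel_magnitude_mean", "app_usage_count",
    #                         "location_entropy"])
    ```

    Personalized (subject-dependent) and hybrid variants would change only the train/test split, e.g., training on a portion of the target user's own data or mixing it with other users' data.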